Comparison of maximum entropy and minimal mutual information in a nonlinear setting
Authors
Abstract
In blind source separation (BSS), two different separation techniques are mainly used: minimal mutual information (MMI), where minimization of the mutual output information yields an independent random vector, and maximum entropy (ME), where the output entropy is maximized. However, it is yet unclear why ME should solve the separation problem, i.e. result in an independent vector. Yang and Amari have given a partial confirmation for ME in the linear case in [18], where they prove that, under the assumption of sources with vanishing expectation, ME does not change the solutions of MMI except for scaling and permutation. In this paper, we generalize Yang and Amari's approach to nonlinear BSS problems, where random vectors are mixed by output functions of layered neural networks. We show that certain solution points of MMI are kept fixed by ME if no scaling is allowed in any layer. In general, however, ME might also change the scaling in the non-output network layers, hence leaving the MMI solution points. Therefore, we conclude this paper by suggesting that in nonlinear ME algorithms, the norm of all weight matrix rows of each non-output layer should be kept fixed in later epochs during network training. © 2002 Elsevier Science B.V. All rights reserved.
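The abstract's closing recommendation, keeping the norm of every weight-matrix row of each non-output layer fixed during later training epochs, can be illustrated with a minimal NumPy sketch. The network shape, the function name `renormalize_rows`, and the renormalization schedule are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def renormalize_rows(W, target_norm=1.0, eps=1e-12):
    """Rescale each row of a weight matrix to a fixed Euclidean norm.

    Fixing row norms removes the per-row scaling freedom in non-output
    layers that, per the abstract, can let ME drift away from MMI
    solution points.
    """
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W * (target_norm / np.maximum(norms, eps))

# Toy two-layer separating network: after a (hypothetical) gradient
# step, only the hidden (non-output) layer is renormalized.
rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(4, 3))   # non-output layer: row norms fixed
W_output = rng.normal(size=(2, 4))   # output layer: scaling left free

W_hidden = renormalize_rows(W_hidden)
print(np.linalg.norm(W_hidden, axis=1))  # each row now has norm 1.0
```

In a training loop the call would follow each weight update, but only from the "later epochs" onward that the abstract suggests.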
Similar resources
Maximum Entropy and Minimal Mutual Information in a Nonlinear Model
In blind source separation, two different separation techniques are mainly used: Minimal Mutual Information (MMI), where minimization of the mutual output information yields an independent random vector, and Maximum Entropy (ME), where the output entropy is maximized. However, it is yet unclear why ME should solve the separation problem, i.e. result in an independent vector. Amari has given a pa...
Information Entropy Suggests Stronger Nonlinear Associations between Hydro-Meteorological Variables and ENSO
Understanding the teleconnections between hydro-meteorological data and the El Niño–Southern Oscillation cycle (ENSO) is an important step towards developing flood early warning systems. In this study, the concept of mutual information (MI) was applied using marginal and joint information entropy to quantify the linear and non-linear relationship between annual streamflow, extreme precipitation...
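The MI concept this study applies, comparing marginal and joint information entropies to capture non-linear dependence, can be sketched with a simple histogram (plug-in) estimator. This is a generic illustration, not the study's estimator; bin count and sample sizes are arbitrary assumptions:

```python
import numpy as np

def entropy(counts):
    """Shannon entropy (in bits) of a histogram given as bin counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=16):
    """MI(X;Y) = H(X) + H(Y) - H(X,Y), estimated from a 2-D histogram.

    Nonlinear dependence raises MI even when linear correlation is
    near zero, which is what makes MI useful for teleconnection studies.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    hx = entropy(joint.sum(axis=1))   # marginal entropy of X
    hy = entropy(joint.sum(axis=0))   # marginal entropy of Y
    hxy = entropy(joint.ravel())      # joint entropy of (X, Y)
    return hx + hy - hxy

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
print(mutual_information(x, x**2))                    # nonlinear dependence: large MI
print(mutual_information(x, rng.normal(size=5000)))   # independence: MI near 0
```

Note that X and X² are uncorrelated for symmetric X, so a linear (correlation-based) measure would miss exactly the kind of association MI detects here.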
Minimal Models of Multidimensional Computations
The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output ...
Plant Classification in Images of Natural Scenes Using Segmentations Fusion
This paper presents a novel approach to automatic classifying and identifying of tree leaves using image segmentation fusion. With the development of mobile devices and remote access, automatic plant identification in images taken in natural scenes has received much attention. Image segmentation plays a key role in most plant identification methods, especially in complex background images. Wher...
Quadratic Mutual Information Feature Selection
We propose a novel feature selection method based on quadratic mutual information which has its roots in Cauchy–Schwarz divergence and Renyi entropy. The method uses the direct estimation of quadratic mutual information from data samples using Gaussian kernel functions, and can detect second order non-linear relations. Its main advantages are: (i) unified analysis of discrete and continuous dat...
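The quadratic-MI idea, an estimator rooted in the Cauchy–Schwarz divergence and built from Gaussian-kernel "information potentials", can be sketched with a naive O(N²) implementation for two 1-D samples. This is a generic textbook-style sketch; the kernel width, sample sizes, and function names are assumptions, and the paper's actual estimator and feature-selection loop are not reproduced:

```python
import numpy as np

def gram(v, sigma=1.0):
    """Gaussian-kernel Gram matrix G[i, j] = exp(-(v_i - v_j)^2 / (2 sigma^2))."""
    d = v[:, None] - v[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def qmi_cs(x, y, sigma=1.0):
    """Cauchy-Schwarz quadratic mutual information between two 1-D samples.

    Formed from three sample 'information potentials'; by the
    Cauchy-Schwarz inequality the result is >= 0, with equality when
    the kernel density estimates factorize (independence).
    """
    Gx, Gy = gram(x, sigma), gram(y, sigma)
    v_joint = np.mean(Gx * Gy)                            # ||p(x,y)||^2
    v_marg = np.mean(Gx) * np.mean(Gy)                    # ||p(x)p(y)||^2
    v_cross = np.mean(Gx.mean(axis=1) * Gy.mean(axis=1))  # <p(x,y), p(x)p(y)>
    return np.log(v_joint * v_marg / v_cross**2)

rng = np.random.default_rng(0)
x = rng.normal(size=400)
noise = rng.normal(size=400)
print(qmi_cs(x, x**2))     # second-order nonlinear relation: clearly positive
print(qmi_cs(x, noise))    # independent samples: close to zero
```

The X-versus-X² pair is the canonical second-order non-linear relation the abstract says this class of estimator can detect, while Pearson correlation between the two is near zero.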
Journal: Signal Processing
Volume 82, Issue
Pages: -
Publication date: 2002